441 research outputs found

    On Using Gait Biometrics to Enhance Face Pose Estimation

    No full text
    Many face biometrics systems use controlled environments where subjects are viewed directly facing the camera. This is less likely to occur in surveillance environments, so a process is required to handle the pose variation of the human head, changes in illumination, and the low frame rate of input image sequences. This has been achieved using scale-invariant features and 3D models to determine the pose of the human subject. Then, a gait trajectory model is generated to obtain the correct face region whilst handling the looming effect. In this way, we describe a new approach aimed at estimating face pose accurately. The contributions of this research include the construction of a 3D model for pose estimation from planar imagery and the first use of gait information to enhance the face pose estimation process.
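
    As a rough illustration of the pose-from-planar-imagery idea (not the authors' implementation, which uses scale-invariant features and a 3D model), the sketch below estimates head orientation from 2D landmark detections and a generic 3D head model via OpenCV's solvePnP; the landmark coordinates, camera intrinsics, and function names are illustrative assumptions.

        # Hypothetical sketch: head pose from 2D landmarks and a generic 3D model.
        # Landmark coordinates (mm) and the focal-length guess are assumptions.
        import numpy as np
        import cv2

        MODEL_POINTS = np.array([
            (0.0,    0.0,    0.0),   # nose tip (model origin)
            (0.0,  -63.6,  -12.5),   # chin
            (-43.3,  32.7,  -26.0),  # left eye outer corner
            (43.3,   32.7,  -26.0),  # right eye outer corner
            (-28.9, -28.9,  -24.1),  # left mouth corner
            (28.9,  -28.9,  -24.1),  # right mouth corner
        ], dtype=np.float64)

        def estimate_head_pose(image_points, frame_w, frame_h):
            """Return (rotation_vector, translation_vector) or (None, None)."""
            camera_matrix = np.array([[frame_w, 0, frame_w / 2],
                                      [0, frame_w, frame_h / 2],
                                      [0, 0, 1]], dtype=np.float64)
            dist_coeffs = np.zeros((4, 1))   # assume negligible lens distortion
            ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points,
                                          camera_matrix, dist_coeffs,
                                          flags=cv2.SOLVEPNP_ITERATIVE)
            return (rvec, tvec) if ok else (None, None)

    The returned rotation vector can be converted to a rotation matrix with cv2.Rodrigues to judge how far the head deviates from a frontal view.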

    On using gait to enhance frontal face extraction

    No full text
    Visual surveillance finds increasing deployment for monitoring urban environments. Operators need to be able to determine identity from surveillance images and often use face recognition for this purpose. In surveillance environments, it is necessary to handle pose variation of the human head, low frame rate, and low resolution of input images. We describe the first use of gait to enable face acquisition and recognition, by analysis of 3-D head motion and gait trajectory, with super-resolution analysis. We use region- and distance-based refinement of head pose estimation. We develop a direct mapping to relate the 2-D image with a 3-D model. In gait trajectory analysis, we model the looming effect so as to obtain the correct face region. Based on head position and the gait trajectory, we can reconstruct high-quality frontal face images which are demonstrated to be suitable for face recognition. The contributions of this research include the construction of a 3-D model for pose estimation from planar imagery and the first use of gait information to enhance the face extraction process, allowing for deployment in surveillance scenarios.
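
    A minimal sketch of the kind of 2-D/3-D mapping the abstract refers to, under assumed camera parameters: a 3-D head position is projected into pixel coordinates with a 3x4 projection matrix P = K[R|t]. The intrinsics, pose, and head position below are illustrative, not values from the paper.

        # Minimal sketch: project a 3-D point into the image with P = K[R|t].
        import numpy as np

        def project_point(P, X_world):
            """Map a 3-D point (X, Y, Z) in metres to pixel coordinates."""
            X_h = np.append(np.asarray(X_world, dtype=float), 1.0)  # homogeneous
            x = P @ X_h
            return x[:2] / x[2]                                     # dehomogenise

        K = np.array([[800.0, 0.0, 320.0],      # assumed focal length and centre
                      [0.0, 800.0, 240.0],
                      [0.0,   0.0,   1.0]])
        Rt = np.hstack([np.eye(3), np.zeros((3, 1))])  # camera at world origin
        P = K @ Rt
        print(project_point(P, (0.1, 1.7, 4.0)))  # a head-height point ~4 m away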

    Audio-Visual Spatial Integration and Recursive Attention for Robust Sound Source Localization

    Full text link
    The objective of the sound source localization task is to enable machines to detect the location of sound-making objects within a visual scene. While the audio modality provides spatial cues to locate the sound source, existing approaches use audio only in an auxiliary role, to compare spatial regions of the visual modality. Humans, on the other hand, utilize both audio and visual modalities as spatial cues to locate sound sources. In this paper, we propose an audio-visual spatial integration network that integrates spatial cues from both modalities to mimic human behavior when detecting sound-making objects. Additionally, we introduce a recursive attention network to mimic the human behavior of iteratively focusing on objects, resulting in more accurate attention regions. To effectively encode spatial information from both modalities, we propose an audio-visual pair matching loss and a spatial region alignment loss. By utilizing the spatial cues of both modalities and recursively focusing on objects, our method can perform more robust sound source localization. Comprehensive experimental results on the Flickr SoundNet and VGG-Sound Source datasets demonstrate the superiority of our proposed method over existing approaches. Our code is available at: https://github.com/VisualAIKHU/SIRA-SSL Comment: Camera-Ready, ACM MM 2023
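
    As a loose illustration of what an audio-visual pair matching objective can look like (the paper defines its own losses; this is only a generic InfoNCE-style contrastive sketch with assumed embedding shapes and temperature):

        # Hypothetical sketch of a pair-matching loss between pooled audio and
        # visual embeddings; matched pairs lie on the diagonal of the similarity
        # matrix, all other pairs act as negatives in both directions.
        import torch
        import torch.nn.functional as F

        def pair_matching_loss(audio_emb, visual_emb, temperature=0.07):
            """audio_emb, visual_emb: (B, D) embeddings from each branch."""
            a = F.normalize(audio_emb, dim=-1)
            v = F.normalize(visual_emb, dim=-1)
            logits = a @ v.t() / temperature          # (B, B) similarities
            targets = torch.arange(a.size(0), device=a.device)
            return 0.5 * (F.cross_entropy(logits, targets) +
                          F.cross_entropy(logits.t(), targets))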

    Estimation of 3D head region using gait motion for surveillance video

    Full text link
    Detecting and recognizing people is important in surveillance. Many detection approaches use local information, such as pattern and colour, which leads to application constraints such as sensitivity to changes in illumination, low resolution, and camera viewpoint. In this paper we propose a novel method for estimating the 3D head region based on analysing the gait motion derived from the video provided by a single camera. Generally, when a person walks there is a known head movement in the vertical direction, regardless of the walking direction. Using this characteristic, the gait period is detected using wavelet decomposition and the heel strike position is calculated in 3D space. Then, a 3D gait trajectory model is constructed by non-linear optimization. We evaluate our new approach using the CAVIAR database and show that we can indeed determine the head region to good effect. The contributions of this research include the first use of human gait to detect the face region, an approach which has fewer application constraints than many previous ones.
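
    A hedged sketch of the gait-period step described above: smooth the vertical head trajectory with a wavelet decomposition and read the dominant period from its autocorrelation. The wavelet ('db4'), decomposition level, frame rate, and synthetic signal are assumptions, not the paper's settings.

        # Hypothetical sketch: gait-period estimation from vertical head motion.
        import numpy as np
        import pywt

        def bounce_period(head_y, fps, wavelet="db4", level=2):
            """Period (s) of the vertical head bounce (one step);
            the full gait cycle is roughly twice this value."""
            coeffs = pywt.wavedec(head_y, wavelet, level=level)
            # Keep only the coarse approximation to suppress tracking jitter.
            coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
            smooth = pywt.waverec(coeffs, wavelet)[:len(head_y)]
            s = smooth - smooth.mean()
            ac = np.correlate(s, s, mode="full")[len(s) - 1:]
            # Skip the central autocorrelation lobe, then take the next peak.
            neg = np.where(ac < 0)[0]
            start = neg[0] if len(neg) else 1
            lag = start + np.argmax(ac[start:])
            return lag / fps

        # Synthetic example: ~0.55 s steps sampled at 25 fps (assumed values).
        t = np.arange(0, 10, 1 / 25)
        y = (1.7 + 0.02 * np.sin(2 * np.pi * t / 0.55)
                 + 0.002 * np.random.randn(len(t)))
        print(bounce_period(y, fps=25))   # roughly 0.55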

    Tau functions as Widom constants

    Full text link
    We define a tau function for a generic Riemann-Hilbert problem posed on a union of non-intersecting smooth closed curves with jump matrices analytic in their neighborhood. The tau function depends on parameters of the jumps and is expressed as the Fredholm determinant of an integral operator with block integrable kernel constructed in terms of elementary parametrices. Its logarithmic derivatives with respect to parameters are given by contour integrals involving these parametrices and the solution of the Riemann-Hilbert problem. In the case of one circle, the tau function coincides with Widom's determinant arising in the asymptotics of block Toeplitz matrices. Our construction gives the Jimbo-Miwa-Ueno tau function for Riemann-Hilbert problems of isomonodromic origin (Painlevé VI, V, III, Garnier system, etc.) and the Sato-Segal-Wilson tau function for integrable hierarchies such as Gelfand-Dickey and Drinfeld-Sokolov. Comment: 26 pages, 6 figures
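
    For orientation only (this is the classical Szegő-Widom statement, not a formula quoted from the paper): in the one-circle case, the "Widom constant" mentioned above is the operator determinant governing the asymptotics of block Toeplitz determinants D_n[f],

        \lim_{n\to\infty} \frac{D_n[f]}{G[f]^{\,n}}
            = \det\bigl(T[f]\,T[f^{-1}]\bigr),
        \qquad
        G[f] = \exp\!\left(\frac{1}{2\pi}\int_0^{2\pi}
            \log\det f\!\left(e^{i\theta}\right) d\theta\right),

    where T[f] denotes the Toeplitz operator with matrix symbol f; the abstract identifies this determinant with the Riemann-Hilbert tau function when the contour is a single circle.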

    Heel strike detection based on human walking movement for surveillance analysis

    Get PDF
    The heel strike is an important cue for human gait recognition and detection in visual surveillance, since the heel strike position can be used to derive the gait periodicity, stride, and step length. We propose a novel method for heel strike detection using a gait trajectory model, which is robust to occlusion, camera view, and low resolution. When a person walks, the movement of the head is conspicuous and sinusoidal. The highest point of the head trajectory occurs when the feet cross (stance) and the lowest point occurs when the gait stride is largest (heel strike). Our gait trajectory model is constructed from trajectory data using non-linear optimisation. Then, the key frames in which the heel strikes take place are calculated. A Region Of Interest (ROI) is extracted using the silhouette image of the key frame as a filter. For candidate detection, gradient descent is applied to detect maxima, which are considered to be the times of the heel strikes. For candidate verification, two filtering methods are used to reconstruct the 3D position of a heel strike using the given camera projection matrix. The contribution of this research is the first use of the gait trajectory in the heel strike position estimation process, and we contend that it is a new approach to basic analysis in surveillance imagery.
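
    A hedged sketch of the key-frame step: fit a sinusoidal head-trajectory model by non-linear least squares and take the troughs of the fitted curve as candidate heel-strike frames. The model form, initial guess, and the use of simple trough-finding (rather than the gradient-descent candidate search described above) are assumptions for illustration.

        # Hypothetical sketch: heel-strike key frames from a fitted trajectory.
        import numpy as np
        from scipy.optimize import curve_fit

        def trajectory(t, a, f, phi, c):
            """Sinusoidal model of the vertical head position."""
            return a * np.sin(2 * np.pi * f * t + phi) + c

        def heel_strike_frames(head_y, fps):
            t = np.arange(len(head_y)) / fps
            p0 = [np.ptp(head_y) / 2, 2.0, 0.0, float(np.mean(head_y))]  # ~2 steps/s
            params, _ = curve_fit(trajectory, t, head_y, p0=p0)
            fitted = trajectory(t, *params)
            # Troughs of the fitted curve (largest stride) mark heel strikes.
            return [i for i in range(1, len(t) - 1)
                    if fitted[i] < fitted[i - 1] and fitted[i] < fitted[i + 1]]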

    Two-gap and paramagnetic pair-breaking effects on upper critical field of SmFeAsO$_{0.85}$ and SmFeAsO$_{0.8}$F$_{0.2}$ single crystals

    Full text link
    We investigated the temperature dependence of the upper critical field [$H_{c2}(T)$] of fluorine-free SmFeAsO$_{0.85}$ and fluorine-doped SmFeAsO$_{0.8}$F$_{0.2}$ single crystals by measuring the resistive transition in low static magnetic fields and in pulsed fields up to 60 T. Both crystals show that $H_{c2}(T)$ along the c axis [$H_{c2}^c(T)$] and in an $ab$-planar direction [$H_{c2}^{ab}(T)$] exhibits a linear and a sublinear increase, respectively, with decreasing temperature below the superconducting transition. In both directions, $H_{c2}(T)$ deviates from the conventional one-gap Werthamer-Helfand-Hohenberg theoretical prediction at low temperatures. A two-gap nature and the paramagnetic pair-breaking effect are shown to be responsible for the temperature-dependent behavior of $H_{c2}^c$ and $H_{c2}^{ab}$, respectively. Comment: 21 pages, 8 figures
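
    For orientation only (standard one-gap expressions, not the paper's two-gap fits): $H_{c2}$ data of this kind are commonly compared against the single-band WHH orbital estimate and the BCS Pauli (paramagnetic) limiting field,

        H_{c2}^{\mathrm{orb}}(0) \approx -0.69\, T_c
            \left.\frac{dH_{c2}}{dT}\right|_{T_c},
        \qquad
        \mu_0 H_{\mathrm{P}}[\mathrm{T}] \approx 1.84\, T_c[\mathrm{K}],

    and deviations from the WHH curve toward either enhanced or suppressed $H_{c2}(T)$ are what motivate invoking multi-gap or paramagnetic pair-breaking effects of the kind discussed in the abstract.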